Hilbert–Schmidt component analysis
Authors
Abstract
We propose a feature extraction algorithm based on the Hilbert–Schmidt independence criterion (HSIC) and the maximum dependence – minimum redundancy approach. Experiments on classification data sets demonstrate that the suggested Hilbert–Schmidt component analysis (HSCA) algorithm may, in certain cases, be more efficient than the other approaches considered.
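Since the abstract only names the ingredients, the following is a minimal illustrative sketch (in Python with NumPy, not the authors' code) of the biased empirical HSIC estimator on which a maximum-dependence / minimum-redundancy objective such as HSCA's can be built; the Gaussian-kernel choice and the function names are assumptions made for this example.

```python
# Minimal sketch (not the authors' implementation): biased empirical HSIC,
# the dependence measure referenced in the abstract. Gaussian kernels and
# the names rbf_kernel / hsic are illustrative assumptions.
import numpy as np


def rbf_kernel(X, sigma=1.0):
    """Gaussian kernel matrix: K[i, j] = exp(-||x_i - x_j||^2 / (2 sigma^2))."""
    X = np.asarray(X, dtype=float)
    if X.ndim == 1:
        X = X[:, None]  # treat a label vector as an (n, 1) matrix
    sq = np.sum(X ** 2, axis=1)
    d2 = np.maximum(sq[:, None] + sq[None, :] - 2.0 * X @ X.T, 0.0)
    return np.exp(-d2 / (2.0 * sigma ** 2))


def hsic(X, Y, sigma_x=1.0, sigma_y=1.0):
    """Biased empirical HSIC: tr(K H L H) / (n - 1)^2 (Gretton et al., 2005)."""
    n = len(X)
    K = rbf_kernel(X, sigma_x)
    L = rbf_kernel(Y, sigma_y)
    H = np.eye(n) - np.ones((n, n)) / n  # centering matrix
    return float(np.trace(K @ H @ L @ H)) / (n - 1) ** 2


# Example: dependence between a data matrix and one-hot encoded class labels.
# X = np.random.randn(100, 5); Y = np.eye(3)[np.random.randint(0, 3, 100)]
# score = hsic(X, Y)
```

In an HSCA-style setting one might, for instance, score a candidate projection W by hsic(X @ W, Y) against the class labels while penalizing dependence on components already extracted; the exact objective and optimization used in the paper are not reproduced here.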
Similar articles
Composition operators acting on weighted Hilbert spaces of analytic functions
In this paper, we consider composition operators on weighted Hilbert spaces of analytic functions and observe that a formula for the essential norm gives a Hilbert–Schmidt characterization and characterizes membership in the Schatten classes for these operators. Closed-range composition operators are also investigated.
G-frames and Hilbert–Schmidt operators
In this paper we introduce and study Besselian $g$-frames. We show that the kernel of the associated synthesis operator of a Besselian $g$-frame is finite-dimensional. We also introduce the $\alpha$-dual of a $g$-frame and obtain some results when the Hilbert–Schmidt norm is used for the members of a $g$-frame in a finite-dimensional Hilbert space.
Measuring Statistical Dependence with Hilbert–Schmidt Norms
We propose an independence criterion based on the eigenspectrum of covariance operators in reproducing kernel Hilbert spaces (RKHSs), consisting of an empirical estimate of the Hilbert–Schmidt norm of the cross-covariance operator (we term this a Hilbert–Schmidt Independence Criterion, or HSIC). This approach has several advantages compared with previous kernel-based independence criteria. Fir...
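For reference (the truncated abstract does not display it), the biased empirical estimate referred to here, for a sample of size $n$ with kernel matrices $K$ and $L$ computed on the two variables, is commonly written as

$$\widehat{\mathrm{HSIC}}_b(X,Y) \;=\; \frac{1}{(n-1)^{2}}\,\operatorname{tr}\!\left(KHLH\right), \qquad H \;=\; I_{n} - \tfrac{1}{n}\mathbf{1}\mathbf{1}^{\top}.$$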
A note on the Young type inequalities
In this paper, we present some refinements of the famous Young type inequality. As an application of our results, we obtain some matrix inequalities for the Hilbert–Schmidt norm and the trace norm. The results obtained in this paper can be viewed as a refinement of the results derived by H. Kai [Young type inequalities for matrices, J. Ea...
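For context only (the specific refinement is not stated in the truncated abstract): the classical scalar Young inequality underlying such results says that for $a, b \ge 0$ and $\nu \in [0,1]$,

$$a^{\nu} b^{1-\nu} \;\le\; \nu a + (1-\nu) b,$$

and Young type matrix inequalities consider bounds of this shape for norms such as the Hilbert–Schmidt and trace norms.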
Kernel Measures of Independence for non-iid Data
Many machine learning algorithms can be formulated in the framework of statistical independence, such as the Hilbert–Schmidt Independence Criterion. In this paper, we extend this criterion to deal with structured and interdependent observations. This is achieved by modeling the structures using undirected graphical models and comparing the Hilbert space embeddings of distributions. We apply this...